Learning Distributed Representations and Deep Embedded Clustering of Texts

Authors

Abstract

Instructors face significant time and effort constraints when grading students' assessments at large scale. Clustering similar assessments is a uniquely effective technique that has the potential to significantly reduce the workload of instructors in large-scale online learning environments. By grouping similar assessments together, the marking of one assessment in a cluster can be scaled to the others, allowing for a more efficient and streamlined grading process. To address this issue, this paper focuses on text assessments and proposes a method for reducing the grading workload by clustering them. The proposed method uses distributed representations to transform texts into vectors and contrastive learning to improve the representations so that they better distinguish differences among texts. The paper presents a general framework that includes label representation, K-means, and self-organizing map algorithms, with the objective of improving clustering performance as measured by the Accuracy (ACC) and Normalized Mutual Information (NMI) metrics. The method is evaluated experimentally on two real datasets. The results show that the self-organizing map and K-means algorithms with pre-trained language models outperform different...
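As a rough illustration of that pipeline (a sketch under assumed tools, not the authors' implementation), the snippet below embeds short answer texts with a pre-trained sentence encoder, clusters the vectors with K-means, and reports ACC and NMI; the encoder name, the toy texts, and the labels are assumptions made for the example.

```python
# Hedged sketch: embed texts with a pre-trained language model, cluster with
# K-means, and evaluate with clustering Accuracy (ACC) and NMI.
# The encoder name, toy texts, and labels below are illustrative assumptions.
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.cluster import KMeans
from sklearn.metrics import normalized_mutual_info_score
from sentence_transformers import SentenceTransformer

def clustering_accuracy(y_true, y_pred):
    """ACC: best one-to-one mapping between predicted clusters and true labels
    (Hungarian algorithm), then the fraction of correctly assigned samples."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    k = max(y_true.max(), y_pred.max()) + 1
    counts = np.zeros((k, k), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        counts[p, t] += 1
    rows, cols = linear_sum_assignment(-counts)   # maximize matched counts
    return counts[rows, cols].sum() / y_true.size

texts = [
    "Sort the list with merge sort in O(n log n).",
    "Merge sort splits the array and merges sorted halves.",
    "Photosynthesis converts light energy into glucose.",
    "Plants use sunlight to synthesize sugars from CO2 and water.",
]
labels = [0, 0, 1, 1]                               # hypothetical ground-truth topics

encoder = SentenceTransformer("all-MiniLM-L6-v2")   # any pre-trained encoder works
embeddings = encoder.encode(texts)                  # distributed representations

pred = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)

print("ACC:", clustering_accuracy(labels, pred))
print("NMI:", normalized_mutual_info_score(labels, pred))
```

A self-organizing map (for example, via the MiniSom package) could be dropped into the same pipeline in place of K-means without changing the evaluation step.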


Similar articles

Learning Distributed Representations of Texts and Entities from Knowledge Base

We describe a neural network model that jointly learns distributed representations of texts and knowledge base (KB) entities. Given a text in the KB, we train our proposed model to predict entities that are relevant to the text. Our model is designed to be generic with the ability to address various NLP tasks with ease. We train the model using a large corpus of texts and their entity annotatio...
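A minimal sketch of that kind of joint text-entity embedding, assuming a mean-of-word-vectors text encoder and a softmax over candidate entities (the cited model itself is likely more elaborate), could look like this:

```python
# Hedged sketch: texts and KB entities share one vector space; a text is encoded
# as the mean of its word vectors and trained to score its annotated entity
# above all others. Sizes, data, and the pooling encoder are assumptions.
import torch
import torch.nn as nn

class TextEntityModel(nn.Module):
    def __init__(self, n_words, n_entities, dim=64):
        super().__init__()
        self.word_emb = nn.Embedding(n_words, dim)
        self.entity_emb = nn.Embedding(n_entities, dim)

    def forward(self, word_ids):
        # word_ids: (batch, seq_len) -> mean-pooled text vector (batch, dim)
        text_vec = self.word_emb(word_ids).mean(dim=1)
        # similarity of each text against every entity embedding
        return text_vec @ self.entity_emb.weight.T     # (batch, n_entities)

model = TextEntityModel(n_words=1000, n_entities=50)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Toy batch: random word-id sequences and the entity annotated for each text.
words = torch.randint(0, 1000, (8, 12))
entities = torch.randint(0, 50, (8,))

for _ in range(100):
    scores = model(words)
    loss = loss_fn(scores, entities)    # push the relevant entity's score up
    opt.zero_grad()
    loss.backward()
    opt.step()
```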


Learning Deep Representations for Graph Clustering

Recently deep learning has been successfully adopted in many applications such as speech recognition and image classification. In this work, we explore the possibility of employing deep learning in graph clustering. We propose a simple method, which first learns a nonlinear embedding of the original graph by stacked autoencoder, and then runs k-means algorithm on the embedding to obtain cluster...
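The two-step recipe in that abstract (learn a nonlinear embedding with an autoencoder, then run k-means on the codes) could be sketched as follows; the single end-to-end autoencoder, the layer sizes, and the random input matrix are simplifying assumptions standing in for the paper's stacked, layer-wise-trained setup on graph data.

```python
# Hedged sketch of "autoencoder embedding + k-means" clustering.
# Layer sizes, training schedule, and the random input matrix are assumptions.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class AutoEncoder(nn.Module):
    def __init__(self, dim_in, dim_hidden=32, dim_code=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(dim_in, dim_hidden), nn.ReLU(),
            nn.Linear(dim_hidden, dim_code),
        )
        self.decoder = nn.Sequential(
            nn.Linear(dim_code, dim_hidden), nn.ReLU(),
            nn.Linear(dim_hidden, dim_in),
        )

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), code

# Stand-in for a graph similarity/adjacency matrix (n nodes x n features).
x = torch.rand(200, 64)

model = AutoEncoder(dim_in=64)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(200):                      # reconstruction training
    recon, _ = model(x)
    loss = loss_fn(recon, x)
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    _, codes = model(x)                   # low-dimensional embedding

clusters = KMeans(n_clusters=4, n_init=10).fit_predict(codes.numpy())
```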


Learning Deep Representations By Distributed Random Samplings

In this paper, we propose an extremely simple deep model for the unsupervised nonlinear dimensionality reduction – deep distributed random samplings. First, its network structure is novel: each layer of the network is a group of mutually independent k-centers clusterings. Second, its learning method is extremely simple: the k centers of each clustering are only k randomly selected examples from...
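Since the preview is truncated, the sketch below is only one plausible reading of that construction: each layer draws several mutually independent sets of k randomly chosen examples as centers and encodes every input by a one-hot indicator of its nearest center in each set; the indicator encoding and all parameter values are assumptions.

```python
# Hedged sketch of one "distributed random samplings" layer: a group of
# independent k-centers clusterings whose centers are randomly chosen examples.
# The one-hot indicator output and all parameter values are assumptions.
import numpy as np

def random_sampling_layer(x, n_clusterings=10, k=5, rng=None):
    rng = np.random.default_rng(rng)
    n = x.shape[0]
    outputs = []
    for _ in range(n_clusterings):
        centers = x[rng.choice(n, size=k, replace=False)]     # k random examples
        dists = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2)
        nearest = dists.argmin(axis=1)                        # nearest-center assignment
        outputs.append(np.eye(k)[nearest])                    # indicator features
    return np.concatenate(outputs, axis=1)                    # layer output

x = np.random.rand(100, 20)
h1 = random_sampling_layer(x, rng=0)      # first layer
h2 = random_sampling_layer(h1, rng=1)     # stack a second layer on the first
print(h1.shape, h2.shape)                 # (100, 50) (100, 50)
```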


Compactifications and Representations of Transformation Semigroups

This thesis deals essentially (but not from all aspects) with the extension of the notion of semigroup compactification and the construction of a general theory of semitopological nonaffine (affine) transformation semigroup compactifications. It determines those compactifications which are universal with respect to some algebraic or topological properties. As an application of the theory, it is i...


Deep Learning of Representations

Unsupervised learning of representations has been found useful in many applications and benefits from several advantages, e.g., where there are many unlabeled examples and few labeled ones (semi-supervised learning), or where the unlabeled or labeled examples come from a distribution that is different from, but related to, the one of interest (self-taught learning, multi-task learning, and domain adaptation). ...



Journal

Journal title: Algorithms

Year: 2023

ISSN: 1999-4893

DOI: https://doi.org/10.3390/a16030158